Chinese-Chinese and English-Chinese Question Answering with ASQA at NTCIR-6 CLQA

Authors

  • Cheng-Wei Lee
  • Min-Yuh Day
  • Cheng-Lung Sung
  • Yi-Hsun Lee
  • Mike Tian-Jian Jiang
  • Chia-Wei Wu
  • Cheng-Wei Shih
  • Yu-Ren Chen
  • Wen-Lian Hsu
Abstract

For NTCIR-6 CLQA, we improved our question answering system ASQA (Academia Sinica Question Answering System), which participated in NTCIR-5 CLQA, so that it could deal with the Chinese-Chinese (C-C) subtask and the English-Chinese (E-C) subtask. There are three innovations in the improved system: (a) to handle the E-C subtask, we built an English question classifier that adopts the Question Informer as a key classification feature; (b) with automatically generated Answer Templates, we can accurately pinpoint the correct answers for some questions; for the questions to which Answer Templates apply, the RU-accuracy is 0.911; and (c) the Answer Ranking module has been improved by incorporating a new feature called SCO-QAT (Sum of Co-occurrence of Question and Answer Terms). In NTCIR-6 CLQA, ASQA achieved 0.553 RU-accuracy in the C-C subtask and 0.34 RU-accuracy in the E-C subtask.
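The name SCO-QAT describes the computation: for every pair of a question term and a candidate-answer term, count their co-occurrences in the retrieved text, and sum over all pairs. The sketch below illustrates that idea at passage granularity; the paper's exact formula (co-occurrence window, normalization, term weighting) may differ, so treat this as an assumption-laden illustration, not the authors' implementation.

```python
def sco_qat(question_terms, answer_terms, passages):
    """Sketch of a SCO-QAT-style score: for each (question term, answer term)
    pair, count the passages in which both terms appear, and sum the counts
    over all pairs. Higher scores favor candidate answers whose terms
    co-occur often with the question's terms."""
    score = 0
    for q in question_terms:
        for a in answer_terms:
            # passage-level co-occurrence; the original may use a finer window
            score += sum(1 for p in passages if q in p and a in p)
    return score

# Toy example with two retrieved passages and one candidate answer term.
passages = [
    "Academia Sinica is located in Taipei",
    "Taipei hosts Academia Sinica research institutes",
]
print(sco_qat(["Academia", "Sinica"], ["Taipei"], passages))  # prints 4
```

Such a feature is cheap to compute from the passages the retrieval stage already returns, which is one reason co-occurrence statistics are a common answer-ranking signal.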


Related papers

ASQA: Academia Sinica Question Answering System for NTCIR-5 CLQA

We propose a hybrid architecture for the NTCIR-5 CLQA C-C (Cross Language Question Answering from Chinese to Chinese) Task. Our system, the Academia Sinica Question-Answering System (ASQA), outputs exact answers to six types of factoid question: personal names, location names, organization names, artifacts, times, and numbers. The architecture of ASQA comprises four main components: Question Pr...


Overview of the NTCIR-6 Cross-Lingual Question Answering (CLQA) Task

This paper describes an overview of the NTCIR-6 Cross-Lingual Question Answering (CLQA) Task, an evaluation campaign for Cross-Lingual Question Answering technology. In NTCIR-5, the first CLQA task targeting Chinese, English, and Japanese languages was carried out. Following the success of NTCIR-5 CLQA, NTCIR-6 hosted the second campaign on the CLQA task. Since the handling of Named Entities is...


Chinese QA and CLQA: NTCIR-5 QA Experiments at UNT

This paper describes our participation in the NTCIR-5 CLQA task. Three runs were officially submitted for three subtasks: Chinese Question Answering, English-Chinese Question Answering, and Chinese-English Question Answering. We expanded our TREC experimental QA system EagleQA this year to include Chinese QA and Cross-Language QA capabilities. Various information retrieval and natural language ...


An Analysis of Question Processing of English and Chinese for the NTCIR 5 Cross-Language Question Answering Task

An important element in question answering systems is the analysis and interpretation of questions. Using the NTCIR 5 Cross-Language Question Answering (CLQA) question test set we demonstrate that the accuracy of deep question analysis is dependent on the quantity and suitability of the available linguistic resources. We further demonstrate that applying question analysis tools developed on mon...


Extracting and Ranking Question-Focused Terms Using the Titles of Wikipedia Articles

At the NTCIR-6 CLQA (Cross-Language Question Answering) task, we participated in the Chinese-Chinese (C-C) and English-Chinese (E-C) QA (Question Answering) subtasks. Without employing question type classification, we proposed a new resource, Wikipedia, to assist in extracting and ranking Question-Focused terms. We regarded the titles of Wikipedia articles as a multilingual noun-phrase corpus w...



Publication date: 2007